A Quick Tour through State Space Reconstruction

This tutorial presents the general idea of attractor reconstruction using synthetic data, and then proposes exercises with simulated and real data. For these activities you will need the most recent version of R and the rEDM package installed on your computer. To start, open R and load the package by typing the following code in the R console:
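A minimal sketch of the loading step (the `install.packages` line is only needed the first time):

```r
## install.packages("rEDM")  # uncomment if the package is not yet installed
library(rEDM)
```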

Before you proceed, please read through the essential concepts in the introduction and the first section (“Empirical Dynamic Modelling”) of rEDM’s tutorial, which you can find here or in your local R installation with the command
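Something like the call below should open the vignette locally (the vignette name is an assumption and may differ across rEDM versions; `browseVignettes("rEDM")` lists whatever is installed):

```r
# Open the rEDM tutorial vignette; use browseVignettes("rEDM") to see
# the vignette names available in your installed version
vignette("rEDM-tutorial", package = "rEDM")
```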

A simple case: harmonic oscillator

Let’s get to grips with the R language and figure out how to reconstruct a state space from a single time series and its lags. For the first example, we use a very well-known system and its state-space representation, the harmonic oscillator:

\(\frac{dx}{dt}=y\)

\(\frac{dy}{dt}=-x\)

which has the solution \(x(t)=A \sin(t), \ y(t)=A \cos(t)\), so we can draw the state space of this system by plotting its two state variables, \(x(t)\) and \(y(t)\), on two Cartesian axes. The attractors of this system are circles, that is, all possible trajectories form circles in the state space:
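A minimal sketch of this plot, using the analytical solution (amplitude \(A = 1\) and a sampling interval of 0.2 time units are assumptions chosen to match the data frame shown further below):

```r
# Simulate the analytical solution of the harmonic oscillator
t <- seq(0, 500, by = 0.2)  # sampling times
A <- 1                      # amplitude (assumed)
x <- A * sin(t)
y <- A * cos(t)
# Every trajectory lies on a circle in the (x, y) state space
plot(x, y, type = "l", asp = 1, xlab = "x(t)", ylab = "y(t)")
```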

And here are the time series for the two state variables:
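The time series can be drawn with base R graphics; a short window is plotted here so the oscillations are visible (the 50-time-unit cutoff is an arbitrary choice):

```r
# Time series of the two state variables over the first 50 time units
t <- seq(0, 50, by = 0.2)
x <- sin(t)
y <- cos(t)
plot(t, x, type = "l", xlab = "time", ylab = "state variables")
lines(t, y, col = "red")
legend("topright", legend = c("x(t)", "y(t)"),
       col = c("black", "red"), lty = 1)
```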

A reconstruction of the state space

Now suppose that all the information you have about this system is a time series of one of the state variables, say \(x\). Because \(y(t)\) is missing, we cannot construct the state space as we did above, but by using the result of Takens’ theorem we can construct a shadow manifold that recovers the information in the original state space. This reconstruction is not a copy of the original attractor, but it is a valid topological reconstruction of the original, a “shadow”.

To do this, we generate new variables by lagging the time series by multiples of a time interval \(\tau\). In this case we will use the interval between observations (\(\tau=0.2\) time units). The R commands below create a data table object (called a data frame in R) with the original time series and its lagged version. The handy function make_block, by the rEDM authors, does this job. Use this command to download the function code and load it into your R workspace (it will be included in rEDM itself in future releases):
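In case the download is unavailable, a minimal stand-in can be written in a few lines. The function below (named `make_lag_block` to avoid clashing with the original; it handles a single series and positive lags only, so it is a sketch, not the rEDM authors’ implementation) pads each lagged column with NAs at the top:

```r
# Minimal stand-in for make_block: builds a data frame with a time
# column, the original series, and lagged copies (NA-padded at the top)
make_lag_block <- function(x, time = seq_along(x), lags = 1) {
  block <- data.frame(time = time, x = x)
  for (k in lags) {
    block[[paste0("x_", k)]] <- c(rep(NA, k), x[seq_len(length(x) - k)])
  }
  block
}
```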

And now you can create a lagged variable data frame with:
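The same lagged data frame can also be built with base R alone; a sketch, assuming the series is \(x(t)=\sin(t)\) sampled every 0.2 time units as above:

```r
t <- seq(0, 500, by = 0.2)
x <- sin(t)
n <- length(x)
# x_1 is x lagged by one observation, i.e. by tau = 0.2 time units
dat <- data.frame(time = t, x = x, x_1 = c(NA, x[-n]))
```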

A quick look at the first and last lines of the data frame can help to understand the lagged variable structure:

##   time         x       x_1
## 1  0.0 0.0000000        NA
## 2  0.2 0.1986693 0.0000000
## 3  0.4 0.3894183 0.1986693
## 4  0.6 0.5646425 0.3894183
## 5  0.8 0.7173561 0.5646425
## 6  1.0 0.8414710 0.7173561
##       time           x         x_1
## 2496 499.0  0.49099533  0.65428133
## 2497 499.2  0.30813490  0.49099533
## 2498 499.4  0.11299011  0.30813490
## 2499 499.6 -0.08665925  0.11299011
## 2500 499.8 -0.28285377 -0.08665925
## 2501 500.0 -0.46777181 -0.28285377

By plotting the time series as a function of its lagged version we get the shadow manifold, which is similar to the original one:
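A sketch of this plot, rebuilding the series so the block is self-contained:

```r
# Shadow manifold: x(t + tau) against x(t), for tau = 0.2
t <- seq(0, 500, by = 0.2)
x <- sin(t)
n <- length(x)
plot(x[-n], x[-1], type = "l", asp = 1,
     xlab = expression(x(t)), ylab = expression(x(t + tau)))
```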

There is a minimum number of dimensions needed to build a valid reconstruction of the original attractor. In this simple case we obviously need two dimensions. Takens’ theorem guarantees that at most \(2D + 1\) dimensions are needed, where \(D\) is the dimensionality of the attractor. Finding the optimal dimension for the shadow manifold is part of a vast topic called embedology (Sauer et al. 1991). The rEDM package has some methods to find the optimal dimension of the manifold. We will learn more in the following tutorials of this course.

Noisy data

How robust is the attractor reconstruction to random noise? There are two main types of noise or error in data, which we call process error and observational error. The first is about random changes in the state variables, like environmental fluctuations that affect the dynamics. The second type of error is related to measurement error in the data, for instance when we miss individuals of some species when counting them, or have limited precision (inherent to all measurement). Observational errors do not affect the dynamics, only the portrait we make of it.

In this section, we check how robust the attractor reconstruction is to measurement error by adding simulated noise to the original time series. Use the commands below to simulate a time series with independent measurement errors of constant variance. To do this, we add to each observation a value drawn from a Gaussian distribution with zero mean and a standard deviation equal to 1/12 of the standard deviation of the time series itself.
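A sketch of the noise step (the seed is an arbitrary choice for reproducibility):

```r
t <- seq(0, 500, by = 0.2)
x <- sin(t)
set.seed(42)  # arbitrary seed, for reproducibility (an assumption)
# Independent Gaussian measurement error, sd = 1/12 of the series' sd
x.noisy <- x + rnorm(length(x), mean = 0, sd = sd(x) / 12)
```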

We thus proceed by creating a new data frame with the lagged variables and by plotting the new shadow manifold:
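A self-contained sketch of this step, rebuilding the noisy series before lagging and plotting:

```r
t <- seq(0, 500, by = 0.2)
x <- sin(t)
set.seed(42)  # arbitrary seed (an assumption)
x.noisy <- x + rnorm(length(x), mean = 0, sd = sd(x) / 12)
n <- length(x.noisy)
# Lagged data frame and shadow manifold of the noisy series
dat.noisy <- data.frame(time = t, x = x.noisy, x_1 = c(NA, x.noisy[-n]))
plot(dat.noisy$x_1, dat.noisy$x, type = "l", asp = 1,
     xlab = expression(x(t)), ylab = expression(x(t + tau)))
```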

In this case the shadow manifold still looks like the original shape of the attractor, despite the noise caused by measurement errors.

Increasing the noise

Measurement errors make attractors more complex by spreading points (and thus trajectories) over the state space. Hence, the trajectories may cross in the shadow manifold, which breaks down the one-to-one mapping to the original attractor. Of course this effect grows as the measurement error increases. We will now investigate a bit more how this affects the attractor reconstruction. The commands below double the measurement error of the time series and plot the shadow manifold:
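A sketch with the error standard deviation doubled (now 1/6 of the series’ standard deviation):

```r
t <- seq(0, 500, by = 0.2)
x <- sin(t)
set.seed(42)  # arbitrary seed (an assumption)
# Doubled measurement error: sd = 2 * sd(x)/12 = sd(x)/6
x.noisy2 <- x + rnorm(length(x), mean = 0, sd = 2 * sd(x) / 12)
n <- length(x.noisy2)
plot(x.noisy2[-n], x.noisy2[-1], type = "l", asp = 1,
     xlab = expression(x(t)), ylab = expression(x(t + tau)))
```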

The attractor reconstruction looks worse now, but we can overcome this problem by adding more dimensions to the shadow manifold, which unfolds crossing trajectories. We do this by adding more axes, which correspond to the time series lagged by further time steps (\(t + \tau, \ t + 2\tau, \ t + 3\tau, \ldots\)). Even if the original manifold has a lower dimension, it is valid to reconstruct the state space with more dimensions (Sauer et al. 1991).

You can check with the interactive plot below how an additional dimension unfolds the reconstructed attractor a bit more (it won’t run on your computer without the plotly library). In the following days we will learn some methods to find the optimal dimensions of the manifold. For now the take-home message is: the more complex the attractor, or the larger the error (observational or not), the more dimensions you will need to make a good reconstruction.

Problems

What is the effect of \(\tau\) ?

The commands below simulate a new time series of the bi-dimensional harmonic oscillator without measurement error but with a much smaller interval between observations (\(\tau=0.01\) time units).
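A sketch of this simulation and of the resulting shadow manifold (with such a small \(\tau\), consecutive observations are nearly identical, so the lag plot collapses toward the diagonal):

```r
# Same oscillator, no measurement error, much finer sampling
t <- seq(0, 500, by = 0.01)
x <- sin(t)
n <- length(x)
plot(x[-n], x[-1], type = "l", asp = 1,
     xlab = expression(x(t)), ylab = expression(x(t + tau)))
```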

This change in \(\tau\) affects the shape of the reconstruction:

How do you explain that? What practical issues does this bring, and how can they be solved?

Hints

1. The command below adds the line \(x(t) = x(t + \tau)\) to the manifold, which can help you understand the problem with small values of \(\tau\):
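A sketch, rebuilding the small-\(\tau\) lag plot so the line has something to be drawn over:

```r
t <- seq(0, 500, by = 0.01)
x <- sin(t)
n <- length(x)
plot(x[-n], x[-1], type = "l", asp = 1,
     xlab = expression(x(t)), ylab = expression(x(t + tau)))
# Identity line x(t) = x(t + tau): intercept 0, slope 1
abline(0, 1, lty = 2, col = "blue")
```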

2. Check this site: (http://www.physics.emory.edu/faculty/weeks//research/tseries4.html)

Attractor reconstruction

In this exercise you have to reconstruct the state space of a time series, checking whether a three-dimensional manifold improves the reconstruction. To load the data in R, run the command:

To take a look at the time series plot, run:

Hints

1. Keep in mind the insight from the previous problem: the value of \(\tau\) changes the shape of the shadow manifold.

2. To create a simple 3D plot in R you can use the scatterplot3d package. The example below does this for one of the noisy time series we examined above:
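A self-contained sketch, rebuilding a noisy series as in the noise section above and lagging it twice to get three coordinates:

```r
library(scatterplot3d)
t <- seq(0, 500, by = 0.2)
x <- sin(t)
set.seed(42)  # arbitrary seed (an assumption)
x.noisy <- x + rnorm(length(x), mean = 0, sd = sd(x) / 12)
n <- length(x.noisy)
# Three coordinates: x(t), x(t + tau), x(t + 2*tau)
scatterplot3d(x.noisy[1:(n - 2)], x.noisy[2:(n - 1)], x.noisy[3:n],
              type = "l",
              xlab = "x(t)", ylab = "x(t + tau)", zlab = "x(t + 2 tau)")
```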

If the package is not available in your local R installation you can install it with:
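```r
install.packages("scatterplot3d")
```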

Learn more

  • Sugihara, G., May, R., Ye, H., Hsieh, C.H., Deyle, E., Fogarty, M. and Munch, S., 2012. Detecting causality in complex ecosystems. Science, 338(6106), pp.496-500.
  • DeAngelis, D.L. and Yurek, S., 2015. Equation-free modeling unravels the behavior of complex ecological systems. Proceedings of the National Academy of Sciences, 112(13), pp.3856-3857.
  • Takens Theorem, a video by Sugihara’s Lab.
  • Takens, Floris. Detecting strange attractors in turbulence. Lecture notes in mathematics 898.1 (1981): 366-381.
  • Deyle, E.R. and Sugihara, G., 2011. Generalized theorems for nonlinear state space reconstruction. PLoS One, 6(3), p.e18295.
  • Sauer, T., Yorke, J.A. and Casdagli, M., 1991. Embedology. Journal of statistical Physics, 65(3), pp.579-616.

Brenno Cabella, Paulo Inácio Prado, Renato Coutinho, Marina Rillo, Rafael Lopes, Roberto Kraenkel

ICTP SAIFR, School on Physics Applications in Biology, January 2018